A tiny new open-source AI model performs as well as powerful big ones

MIT Technology Review

Meanwhile, Ai2 says a smaller Molmo model, with 7 billion parameters, comes close to OpenAI's state-of-the-art model in performance, an achievement it ascribes to vastly more efficient data collection and training methods. What Molmo shows is that open-source AI development is now on par with closed, proprietary development, says Ali Farhadi, the CEO of Ai2. And open-source models have a significant advantage: their open nature means other people can build applications on top of them. A Molmo demo is publicly available, and the models will be available for developers to tinker with on the Hugging Face website. Other large multimodal language models are trained on vast data sets containing billions of images and text samples that have been hoovered up from the internet, and they can include several trillion parameters.
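For developers curious what "tinkering with it on Hugging Face" might look like in practice, below is a minimal sketch of loading and prompting the 7-billion-parameter model with the Transformers library. The repository ID `allenai/Molmo-7B-D-0924` and the custom `process()`/`generate_from_batch()` entry points are assumptions drawn from the model's Hugging Face model card, not details from this article; check the model card for the current usage example.

```python
# Minimal sketch, not an official example: the repo ID and the custom
# process()/generate_from_batch() methods are assumptions based on the
# Hugging Face model card and may change.
import requests
from PIL import Image
from transformers import AutoModelForCausalLM, AutoProcessor, GenerationConfig

REPO = "allenai/Molmo-7B-D-0924"  # assumed repository ID for the 7B model

# trust_remote_code pulls in Ai2's own preprocessing and generation code,
# since Molmo is not a built-in Transformers architecture.
processor = AutoProcessor.from_pretrained(
    REPO, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)
model = AutoModelForCausalLM.from_pretrained(
    REPO, trust_remote_code=True, torch_dtype="auto", device_map="auto"
)

# Ask the model to describe an arbitrary web image.
image = Image.open(requests.get("https://picsum.photos/536/354", stream=True).raw)
inputs = processor.process(images=[image], text="Describe this image.")
inputs = {k: v.to(model.device).unsqueeze(0) for k, v in inputs.items()}

output = model.generate_from_batch(
    inputs,
    GenerationConfig(max_new_tokens=200, stop_strings="<|endoftext|>"),
    tokenizer=processor.tokenizer,
)

# Strip the prompt tokens and decode only the newly generated ones.
new_tokens = output[0, inputs["input_ids"].size(1):]
print(processor.tokenizer.decode(new_tokens, skip_special_tokens=True))
```

The `trust_remote_code=True` flag is the standard Transformers mechanism for models that ship their own modeling code, which is why the generation call here differs from the usual `model.generate()` pattern.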